1.
Prev Vet Med ; 211: 105818, 2023 Feb.
Article in English | MEDLINE | ID: mdl-36543068

ABSTRACT

Various case descriptions and scoring systems have been used to define neonatal calf diarrhea (NCD) and diverse diarrhea-related outcomes are reported, which limits direct comparison between studies. Therefore, the objective of this scoping review was to characterize the case definitions used for NCD and describe diarrhea-related outcomes to inform future efforts towards standardization. A literature search identified articles using 3 databases (Medline, CAB Direct, Agricola), along with Google and Google Scholar. This returned 16,854 unique articles, which were then screened for eligibility by two independent reviewers, resulting in 555 being selected for data extraction. Among articles, the study populations included mostly dairy-breed calves (88%; n = 486) while the remainder evaluated beef, crossbred, or dual purpose beef/dairy calves (10%; n = 53), or did not report breed (3%; n = 16). Studies used between 1 and 8 metrics to define NCD, with 933 unique metrics extracted in total. The most common metric was fecal consistency alone (30%; n = 281), or with at least 1 other metric (26%; n = 241). To define diarrhea, fecal consistency was either described qualitatively (e.g., "profuse liquid feces"), or semi-quantitatively, for example using a scoring system that frequently included 4 levels (n = 208). Some NCD case definitions included fecal color, volume, or odor (10%; n = 98), physical exam parameters (8%; n = 79), the duration of abnormal feces (7%; n = 67), the presence of abnormal contents (e.g., blood, 7%; n = 61), farm treatment records (6%; n = 54), fecal dry matter (1%; n = 12), or another metric (4%; n = 41). One or more references were cited for the NCD case definition by 49% of studies (n = 273/555), with the most common references being Larson et al. (1977) (n = 85), and McGuirk (2008) (n = 59). 
In the 555 included articles, 979 unique diarrhea-related outcomes were found, most commonly a binary categorization of calves having or not having diarrhea (49%; n = 483). Other articles reported statistical outcomes calculated from fecal scores (16%; n = 159), multiple diarrhea severities (10%; n = 95), or the age calves first developed NCD (8%; n = 76). This review characterized substantial heterogeneity among NCD case definitions and diarrhea-related outcomes, which limits interpretation and comparison of studies. Future work is required to develop and validate reporting standards for NCD to optimize knowledge synthesis and support rigorous and ethical calf health research.


Subject(s)
Cattle Diseases , Noncommunicable Diseases , Animals , Cattle , Cattle Diseases/diagnosis , Cattle Diseases/epidemiology , Noncommunicable Diseases/veterinary , Diarrhea/veterinary , Feces , Farms
2.
J Dairy Sci ; 106(1): 703-717, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36333146

ABSTRACT

The primary objective of this study was to compare male and female dairy calf management practices and evaluate risk factors associated with differences in care. Secondary objectives were to understand surplus calf transportation and marketing practices and investigate incentives to motivate calf care improvements. An online survey was distributed to all dairy producers in Ontario (n = 3,367) from November 2020 to March 2021 and Atlantic Canada (n = 557) from April to June 2021. Dairy producers were identified through provincial dairy associations and contacted via e-mail and social media. Descriptive statistics were computed, and a logistic regression model was created to evaluate factors associated with using discrepant feeding practices (i.e., feeding less colostrum, feeding colostrum later, or feeding raw, unsalable milk) for male calves compared with females. The survey had a 7.4% response rate (n = 289/3,924) and was primarily filled out by farm owners (76%). Although colostrum and milk feeding practices were similar between male and female calves, male calves received less milk while still on the dairy farm of origin compared with females. Male calves were also more likely to be fed a higher proportion of raw, unsalable milk. Female producers and those who kept their male calves beyond 10 d of age had lower odds of using poorer feeding practices for male calves. Male calves were mostly sold between 1 and 10 d of age (64%), primarily through direct sales to a calf-rearing facility (45%), with auctions being the next most common method (35%). A small but notable proportion of producers (18%) agreed that euthanizing male calves is a reasonable alternative when their sale price is very low; however, few producers (13%) reported that financial costs limited their male calf care. The largest proportion (43%) of producers reported that a price premium for more vigorous calves would motivate them to take better care of their male calves.
Conversely, only 28% of producers reported that a price discount for calves in poor condition would be motivating. Producers placed importance on the opinion of their calf buyer, their herd veterinarian, and the Canadian Code of Practice for the Care and Handling of Dairy Cattle when considering their calf care practices, and they highly valued practices that promote calf health. Respondents to this survey reported a lower proportion of tiestall barn use and higher milk productivity compared with typical dairy herds in the region, suggesting selection bias for more progressive dairy producers. Nevertheless, our results suggest that dairy producers provide similar care between male and female calves, but some male calves experience challenges due to milk feeding and marketing practices. Feedback from calf buyers along with continued support and guidance from herd veterinarians and the Code of Practice may motivate dairy producers to improve male calf care.
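The odds-based comparisons reported above (e.g., lower odds of poorer feeding practices) come from a logistic regression model, but the core quantity can be illustrated with a simple 2 × 2 table. A minimal sketch in Python, using hypothetical counts that are not from the survey:

```python
from math import exp, log, sqrt

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
    a = exposed cases, b = exposed non-cases,
    c = unexposed cases, d = unexposed non-cases."""
    return (a * d) / (b * c)

def or_ci95(a, b, c, d):
    """Wald 95% confidence interval for the odds ratio,
    computed on the log-odds scale."""
    or_ = odds_ratio(a, b, c, d)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = exp(log(or_) - 1.96 * se)
    hi = exp(log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts (NOT survey data): producers keeping male calves
# beyond 10 d (exposed) vs. using discrepant feeding practices (case).
print(or_ci95(10, 90, 30, 70))  # OR < 1 means lower odds for the exposed group
```

An OR below 1 with a CI excluding 1, as in this fabricated example, is the pattern behind the "lower odds of poorer feeding practices" finding.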


Subject(s)
Colostrum , Dairying , Pregnancy , Cattle , Animals , Male , Female , Dairying/methods , Milk , Farms , Surveys and Questionnaires , Ontario
4.
JDS Commun ; 3(3): 201-206, 2022 May.
Article in English | MEDLINE | ID: mdl-36338813

ABSTRACT

Group housing of preweaning dairy calves is increasing in popularity throughout the dairy industry. However, it can be more difficult to individually monitor calves to identify disease in these group systems. Automated milk feeders (AMF) not only provide producers with the opportunity to increase the milk allowance offered to preweaning calves, but they can also monitor individual feeding behaviors that could identify calves at increased risk of disease. The objective of this retrospective case-control study was to determine how feeding behaviors change in preweaning calves leading up to and during a disease bout. This study was conducted between fall 2015 and fall 2016 on 2 commercial dairy farms in Ontario, Canada. Producers' treatment records for respiratory or enteric illness were used to identify cases. Control calves were selected from calves not treated for disease and matched on the number of days on the AMF. Both farms housed calves in dynamic groups of 9 to 11 calves with an AMF and fed milk replacer. Differences in feeding behaviors, including milk consumption, drinking speed, rewarded visits, unrewarded visits, and total visits to the AMF per day, were analyzed by mixed models accounting for repeated measures. Data were analyzed for the 7 d before, the day of, and 7 d after treatment. A total of 28 case and 28 control calves (n = 56) were analyzed. Calves with disease consumed significantly less milk than their healthy counterparts, beginning 5 d before and continuing until 3 d after disease detection. Sick calves had fewer unrewarded visits starting 3 d before until 2 d after illness detection. Sick calves drank significantly more slowly starting 4 d before illness detection until the day after illness detection compared with healthy controls. No differences were found between cases and controls for rewarded visits. Calves on a high plane of milk nutrition significantly alter feeding behaviors before illness detection.
Data from AMF on feeding behaviors may help to detect disease earlier in preweaning dairy calves.

5.
JDS Commun ; 3(1): 72-77, 2022 Jan.
Article in English | MEDLINE | ID: mdl-36340675

ABSTRACT

Antimicrobials should be used prudently in farm animals to prevent the development of resistant bacteria in both humans and animals. The objective of this study was to investigate Canadian dairy producers' practices for antimicrobial use in the treatment of disease in preweaning dairy calves. In-person questionnaires were administered to 144 dairy producers across 5 provinces in Canada between July 2019 and August 2020. Almost all (96%) producers used antimicrobials to treat calves with respiratory disease, but only 27% indicated they had a written treatment protocol for respiratory disease. Most (95%) of these protocols for respiratory disease were developed with input from the herd veterinarian. Seventy-four percent of producers used antimicrobials to treat calf diarrhea, with 37% of producers having a written treatment protocol for calf diarrhea with input from the herd veterinarian. The combinations of signs adopted by the producers for antimicrobial treatment in calf respiratory disease and diarrhea were evaluated based on findings from other studies. More than half (56%) of producers who used antimicrobials for calf respiratory disease decided to use antimicrobials by evaluating multiple clinical signs. Eighty-two percent of producers who used antimicrobials for calf diarrhea made decisions based on systemic signs of disease, presence of bloody stool, no response to previous treatment, or on the recommendation from the herd veterinarian. Producers with a written treatment protocol had 3 to 7 times greater odds of using antimicrobials based on multiple signs or systemic signs of disease compared with those without a protocol. Further research may investigate other calf management practices related to decision-making by producers in using antimicrobials to improve antimicrobial stewardship on dairy farms.

6.
Article in English | MEDLINE | ID: mdl-36294146

ABSTRACT

Farmers in Canada faced higher levels of mental distress than the general public prior to the Coronavirus Disease 2019 (COVID-19) pandemic and are generally less likely than the public to seek help. However, the mental health impacts of COVID-19 on farmers in Canada remain unexplored. Our objective was to investigate mental health outcomes among farmers in Canada by gender and within the context of COVID-19. We conducted a national, online, cross-sectional survey of farmers in Canada (February-May 2021). The survey included validated scales of anxiety, depression, perceived stress, burnout (emotional exhaustion, cynicism, professional efficacy), alcohol use, resilience, and questions regarding participants' perceived changes in these outcomes during the pandemic. Data were also collected on the impact of COVID-19-specific social and economic factors on mental health, help-seeking, and sense of community belonging through the pandemic. Descriptive statistics were summarized, and Chi-square analyses and t-tests were conducted to compare survey results between genders and to data collected in our similar 2016 survey and normative population data. A total of 1167 farmers participated in the survey. Participants scored more severely across scales than scale norms and the general Canadian population during COVID-19. Scale means were consistent between the 2016 and 2021 samples. Most participants with moderate to severe scores for any outcome reported worsening symptoms since the pandemic began. Women fared significantly worse than men across measures. Over twice as many women as men reported seeking mental health or substance use support during the pandemic. Participants rated the mental health impacts of all COVID-19-related social and economic factors examined significantly (p < 0.05) differently from the Canadian public. The pandemic has negatively impacted the mental health of farmers in Canada, in ways that differ from the general population. 
National level and gender-specific mental health supports are needed to help improve the mental health of farmers in Canada.


Subject(s)
COVID-19 , Substance-Related Disorders , Female , Humans , Male , COVID-19/epidemiology , Mental Health , Farmers/psychology , Cross-Sectional Studies , Canada/epidemiology , Anxiety/epidemiology , Anxiety/psychology , Depression/epidemiology
7.
J Dairy Sci ; 105(11): 8594-8608, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36055845

ABSTRACT

Clinical trials are a valuable study design for evaluating interventions when it is ethical and feasible for investigators to randomly allocate study animals to intervention groups. Researchers may choose to evaluate the comparative efficacy of intervention groups for their effect on outcomes that are relevant to the specific objectives of their trial. However, the results across multiple trials on the same intervention and with the same outcome should be considered when making decisions on whether to use an intervention, because the results of a single trial are subject to sampling error and do not reflect all biological variability. The objective of this review was to provide an overview of important concepts when selecting intervention groups and outcomes within a randomized controlled trial, and when building a body of evidence for intervention efficacy across multiple trials. Empirical evidence is presented to highlight that integrating and interpreting the efficacy of an intervention across trials is hindered by a lack of replication of interventions across trials. Inconsistency in the outcomes and their measurement among trials also limits the ability to build a body of evidence for the efficacy of interventions. The development of core outcome sets for specific topic areas in dairy science, updated as necessary, may improve consistency across trials and aid in the development of a body of evidence for evidence-based decision-making.


Subject(s)
Clinical Trials, Veterinary as Topic , Research Design , Animals , Cattle
8.
J Dairy Sci ; 105(8): 6809-6819, 2022 Aug.
Article in English | MEDLINE | ID: mdl-35688730

ABSTRACT

This review synthesizes research findings on the pain and welfare of dairy calves undergoing disbudding procedures. We describe disbudding practices in North America as well as the use and perceptions of pain control for these procedures. Governing bodies across Canada and the United States, including each country's veterinary medical association and nationwide initiatives such as proAction and Farmers Assuring Responsible Management (FARM), recommend or require the use of a local anesthetic, a nonsteroidal anti-inflammatory drug (NSAID), and a sedative for disbudding procedures. Although the use of pain relief for disbudding has increased over the past decade or so, some in the dairy industry still do not believe that pain control for disbudding is necessary. As a painful procedure, disbudding has numerous welfare impacts on the calf both during and following the procedure that can be categorized under all 3 principles of animal welfare: natural living, biological functioning, and affective state. The use of pain control for disbudding; namely, a local anesthetic and NSAID, can improve welfare outcomes such as procedure-induced pain behavior, cortisol concentrations, mechanical nociceptive threshold, emotional states, and so on, compared with no pain control for the procedure. Although extensive research exists on pain control practices for disbudding, this review identified further gaps in knowledge and areas for future research. Mechanical nociceptive threshold can be evaluated around the disbudding wounds and is a reliable test in older calves; however, this outcome in very young calves after caustic paste disbudding has been reported to be inconclusive compared with that in older calves. As well, research evaluating xylazine sedation for disbudding has reported both potentially positive and negative results that are difficult to interpret or base suggestions on for the use of this drug. 
Finally, wounds caused by disbudding take a long time to heal (up to 13 wk) and have increased sensitivity for the entire healing process. Therefore, future research should aim to (1) determine accurate behavioral tests for calves under 1 wk of age undergoing disbudding to better understand their experience, (2) further attempt to understand the effects of xylazine sedation for disbudding and potential impacts of providing this medication, and (3) determine more ways to reduce the healing time and pain experienced by the calf after disbudding procedures.


Subject(s)
Horns , Anesthetics, Local/pharmacology , Animals , Anti-Inflammatory Agents, Non-Steroidal/therapeutic use , Cattle , Horns/surgery , Humans , Pain/drug therapy , Pain/veterinary , Students , Xylazine
9.
Vet Sci ; 9(6)2022 Jun 11.
Article in English | MEDLINE | ID: mdl-35737340

ABSTRACT

The objective of this scoping review was to describe and characterize the existing literature regarding umbilical health and identify gaps in knowledge. Six databases were searched for studies examining umbilical health in an intensively raised cattle population. There were 4249 articles initially identified; from these, 723 full text articles were then screened, with 150 articles included in the review. Studies were conducted in the USA (n = 41), Brazil (n = 24), Canada (n = 13), UK (n = 10), and 37 additional countries. Seventeen were classified as descriptive, 24 were clinical trials, and 109 were analytical observational studies. Umbilical outcomes evaluated in descriptive studies were infection (n = 11), parasitic infection (n = 5), and hernias (n = 2). Of the clinical trials, only one examined treatment of navel infections; the remainder evaluated preventative management factors for navel health outcomes (including infections (n = 17), myiasis (n = 3), measurements (n = 5), hernias (n = 1), and edema (n = 1)). Analytical observational studies examined risk factors for umbilical health (n = 60) and umbilical health as a risk factor (n = 60). Studies examining risk factors for umbilical health included navel health outcomes of infections (n = 28; 11 of which were not further defined), hernias (n = 8), scoring the navel sheath/flap size (n = 16), myiasis (n = 2), and measurements (n = 6). Studies examining umbilical health as a risk factor defined these risk factors as infection (n = 39; of which 13 were not further defined), hernias (n = 8; of which 4 were not further defined), navel dipping (n = 12), navel/sheath scores as part of conformation classification for breeding (n = 2), measurements (n = 3), and umbilical cord drying times (n = 2). This review highlights the areas in need of future umbilical health research such as clinical trials evaluating the efficacy of different treatments for umbilical infection. 
It also emphasizes the importance for future studies to clearly define umbilical health outcomes of interest, and consider standardization of these measures, including time at risk.

10.
J Dairy Sci ; 105(7): 6083-6093, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35570039

ABSTRACT

The objective of this randomized clinical trial was to evaluate the effectiveness of a single application of 7% iodine tincture-based umbilical dip for preventing infection of the external umbilical structures in dairy calves. Five dairy farms in southern Ontario were visited twice weekly from September 2020 through June 2021. Female (n = 244) and male (n = 40) Holstein calves were randomly assigned at birth to receive either a 7% iodine tincture-based umbilical dip (n = 140) or no treatment (n = 144). Calves in the 7% iodine tincture umbilical dip group had the product administered once by the producer shortly after birth. For each newborn calf, the experimental group, calving difficulty, volume and timing of colostrum administration, time of birth, calving pen cleanliness, and the dam ID were recorded. Calf body weight was recorded during the first visit after birth, and a blood sample was collected for measurement of serum IgG concentration. Calves were health scored twice weekly from enrollment until approximately 30 d of age for assessment of external umbilical infection, joint inflammation, respiratory disease, and diarrhea. The primary outcome of the study was external umbilical infection, which was defined as an enlarged umbilicus with pain, heat, or a malodorous discharge. Calves were also weighed at 30 and 60 d to determine average daily gain. Serum IgG concentration and birth weight did not differ significantly between groups. Twenty-nine calves (20%) in the umbilical dip group developed an external umbilical infection, compared with 31 calves (22%) in the control group. A mixed logistic regression model, accounting for farm as a random effect, showed no effect of treatment on the incidence of an external umbilical infection. However, for every additional hour that calves received colostrum after birth, the odds of developing an external umbilical infection increased during the first month of life (odds ratio = 1.15; 95% confidence interval: 1.04-1.26). 
Additionally, treatment had no effect on respiratory disease, joint inflammation, diarrhea, average daily gain, or mortality, compared with the untreated control. These findings suggest that administering a single application of 7% iodine tincture dip to the umbilicus around the time of birth may not be effective for preventing external umbilical infections. Farm-level management factors, including colostrum management, appear to have more influence on risk of this disease.
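The per-hour odds ratio of 1.15 reported above compounds multiplicatively under the logistic model, so longer colostrum delays imply sharply higher odds. A small sketch of that compounding, assuming the log-linear dose-response extends across the range of delays (an assumption; the model only estimates a per-hour slope):

```python
# Reported per-hour odds ratio for delayed colostrum feeding.
OR_PER_HOUR = 1.15

def odds_multiplier(hours):
    """Multiplicative change in the odds of external umbilical
    infection after `hours` of colostrum delay, assuming a
    log-linear (logistic) dose-response."""
    return OR_PER_HOUR ** hours

for h in (1, 6, 12):
    print(h, round(odds_multiplier(h), 2))
```

Under this assumption, a 6-hour delay roughly doubles the odds and a 12-hour delay more than quintuples them, which is why the authors point to colostrum management as influential.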


Subject(s)
Iodine , Respiratory Tract Diseases , Animals , Animals, Newborn , Cattle , Colostrum , Diarrhea/veterinary , Female , Immunoglobulin G , Inflammation/veterinary , Male , Pregnancy , Respiratory Tract Diseases/veterinary , Umbilicus
11.
J Dairy Sci ; 105(7): 6220-6239, 2022 Jul.
Article in English | MEDLINE | ID: mdl-35570043

ABSTRACT

The objective of this study was to determine the effect of a biologically normal plane of nutrition compared with a limited plane on the primary outcome, wound healing, and the effect of one dose of a nonsteroidal anti-inflammatory drug (NSAID) compared with 2 doses on the secondary outcomes of lying behavior, haptoglobin concentrations, and mechanical nociceptive threshold (MNT), in calves disbudded via cautery iron. Eighty female Holstein calves were enrolled at birth, individually housed, and fed via a Calf Rail system (Förster Technik). A 2 × 2 factorial design was used to assess the effect of plane of nutrition and an additional NSAID. Calves were randomly assigned to a biologically normal plane of nutrition (BN; offered up to 15 L/d) or a limited plane (LP; offered up to 6 L/d) and to receive one or 2 doses of meloxicam. All calves received a lidocaine cornual nerve block and a subcutaneous injection of meloxicam 15 min before cautery disbudding at 18 to 25 d of age, and half the calves received an additional injection of meloxicam (0.5 mg/kg) 3 d after disbudding. Tissue type present, wound diameter, and wound depth were evaluated 2 times per week for 7 to 8 wk as measures of wound healing; lying behavior was recorded beginning 1 to 2 wk before disbudding until 7 to 8 wk after as a behavioral indicator of pain; haptoglobin concentrations were measured once per day for 7 d after disbudding; and MNT was evaluated 2 times/wk for 3 wk. Wound healing data were analyzed using Cox regression (survival) models, and continuous data were analyzed using mixed-effect linear regression models. Only 12% of horn buds were completely healed by 7 to 8 wk after disbudding, and 54% had re-epithelized at this time. At any time, wounds from BN calves were more likely to have had re-epithelization occur compared with wounds from LP calves (hazard ratio: 1.93, 95% CI: 1.18-3.14). 
Wounds from calves that received only one dose of NSAID were more likely to have re-epithelization occur, compared with wounds from calves given 2 doses (hazard ratio: 1.87, 95% CI: 1.15-3.05). Wounds from BN calves had smaller diameters and depths over time beginning on wk 3 compared with LP calves. Wounds from calves that received an additional NSAID had larger diameters and depths over time beginning on wk 4 and 3, respectively, compared with calves that only received one dose of NSAID. Calves that received an extra NSAID tended to be less sensitive 7, 10, and 17 d after disbudding compared with calves that received only one dose, and spent less time lying in the week after disbudding. Calves on the BN milk program were more active compared with LP calves, with lower lying times, fewer lying bouts per day, and longer average lying bouts. Our results indicate that a BN milk feeding program for calves can result in faster healing times and more activity, and that providing an extra NSAID 3 d after disbudding appears to slow the healing process but may result in less pain experienced by the calf 1 to 2 wk after the procedure. This study is also among the first to demonstrate that, after the complete removal of the horn bud, wounds can take more than 8 wk to re-epithelize and fully heal.
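Under the proportional-hazards assumption behind the Cox models used here, a hazard ratio rescales the whole survivor curve: S1(t) = S0(t)^HR, where "survival" means the wound has not yet re-epithelized. A rough sketch of what an HR of 1.93 implies, treating the pooled 54% re-epithelization figure as an illustrative baseline (an assumption for demonstration; the true LP-group baseline is not reported in the abstract):

```python
def survivor_under_hr(s0, hr):
    """Proportional hazards: the comparison group's survivor
    probability is the baseline survivor probability raised
    to the hazard ratio, S1(t) = S0(t) ** hr."""
    return s0 ** hr

# Illustrative baseline: 54% re-epithelized by wk 7-8, so 46% "surviving"
# (still unhealed). HR 1.93 is the reported BN vs. LP hazard ratio.
s0 = 0.46
s1 = survivor_under_hr(s0, 1.93)
print(round(1 - s1, 2))  # implied re-epithelized fraction under HR 1.93
```

The point of the sketch is qualitative: an HR near 2 shifts a roughly half-healed cohort to a substantially more healed one at the same time point.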


Subject(s)
Horns , Animals , Anti-Inflammatory Agents, Non-Steroidal/therapeutic use , Cattle , Cautery/veterinary , Female , Haptoglobins , Horns/surgery , Meloxicam , Pain/drug therapy , Pain/veterinary , Wound Healing
12.
Animals (Basel) ; 12(8)2022 Apr 09.
Article in English | MEDLINE | ID: mdl-35454220

ABSTRACT

Canadian dairy farmers are required to use a local anesthetic and analgesic prior to all disbudding and dehorning procedures. This study was done to investigate the opinions of Ontario dairy farmers on the use of pain control for disbudding and dehorning calves and their perspectives on the current requirements of the quality assurance program. Interviews were conducted with 29 dairy farmers across Ontario. All participants used a cautery iron to disbud or dehorn their calves and some form of pain control (i.e., NSAID and/or local anesthetic). Of the 29 producers that were interviewed, 22 (76%) were in compliance with the proAction requirements for pain control. Many participants felt positive about the use of pain control for these practices. Education from veterinarians was one of the most commonly listed resources to reduce barriers to pain control use by producers. A farmer's attitude was highly referenced as an influence on producer behaviour. Although participants had positive views of pain control use, full compliance with national quality assurance requirements for disbudding and dehorning was not met by all. Producer education through veterinarians is a potential avenue to encourage the adoption of pain control use for disbudding and dehorning practices.

13.
Can Vet J ; 63(3): 260-268, 2022 03.
Article in English | MEDLINE | ID: mdl-35237012

ABSTRACT

The objectives of this study were to i) describe Escherichia coli and Salmonella isolates; ii) investigate the temporal trends in antimicrobial resistance (AMR) profiles; and iii) evaluate the impact of season and age on these AMR profiles from diagnostic and post-mortem samples in Ontario calves ≤ 2 months old submitted from 2007 to 2020 to the Animal Health Laboratory in Guelph, Ontario, Canada. Antimicrobial susceptibility testing results were measured by the Kirby-Bauer disk diffusion method. A total of 1291 isolates with AMR profiles were obtained from calves, with E. coli (n = 434) and Salmonella (n = 378) being the most common bacteria characterized for AMR. For E. coli, 79% of isolates tested showed a positive result in F5/K99, whereas for Salmonella isolates, S. Typhimurium (33%) and S. Dublin (22%) were the 2 most common serotypes identified. Multivariable logistic regression models were built to evaluate AMR profiles for E. coli (n = 414) and Salmonella (n = 357) to each antimicrobial tested. Most E. coli isolates (91%) and Salmonella isolates (97%) were resistant to at least one of the antimicrobials tested. In general, E. coli and Salmonella had higher odds of resistance in calves aged ≥ 2 wk compared to 1-week-old calves, and little difference was seen in the level of resistance over the years observed or between seasons for most of the antimicrobials tested. Prospective research should investigate potential risk factors for the development of AMR in calves, such as antimicrobial use and farm management practices.
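Kirby-Bauer disk diffusion, the susceptibility method used above, interprets the diameter of the growth-inhibition zone around an antimicrobial disk against susceptible/intermediate/resistant breakpoints: larger zones mean the drug inhibited growth farther from the disk. A minimal sketch with hypothetical CLSI-style breakpoints (illustrative values only, not the Animal Health Laboratory's actual panel):

```python
# Hypothetical breakpoints in mm -- NOT real CLSI values.
# "S": zone >= this is susceptible; "R": zone <= this is resistant;
# anything in between is intermediate.
BREAKPOINTS = {
    "ampicillin":   {"S": 17, "R": 13},
    "tetracycline": {"S": 15, "R": 11},
}

def classify(drug, zone_mm):
    """Kirby-Bauer interpretation: map an inhibition-zone
    diameter to S (susceptible), I (intermediate), or R (resistant)."""
    bp = BREAKPOINTS[drug]
    if zone_mm >= bp["S"]:
        return "S"
    if zone_mm <= bp["R"]:
        return "R"
    return "I"

print(classify("ampicillin", 20))  # S
print(classify("ampicillin", 15))  # I
print(classify("ampicillin", 10))  # R
```

A resistance profile for an isolate is simply this classification repeated across the antimicrobial panel, which is what the logistic regression models above treat as outcomes.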



Subject(s)
Cattle Diseases , Escherichia coli Infections , Animals , Anti-Bacterial Agents/pharmacology , Cattle , Cattle Diseases/epidemiology , Drug Resistance, Bacterial , Escherichia coli , Escherichia coli Infections/drug therapy , Escherichia coli Infections/epidemiology , Escherichia coli Infections/veterinary , Feces/microbiology , Microbial Sensitivity Tests/veterinary , Ontario/epidemiology , Prospective Studies , Salmonella
14.
Animals (Basel) ; 12(2)2022 Jan 11.
Article in English | MEDLINE | ID: mdl-35049793

ABSTRACT

The objective of this case-control study was to determine if feeding behavior data collected from an automated milk feeder (AMF) could be used to predict neonatal calf diarrhea (NCD) in the days surrounding diagnosis in pre-weaned group housed dairy calves. Data were collected from two research farms in Ontario between 2017 and 2020 where calves fed using an AMF were health scored daily and feeding behavior data (milk intake (mL/d), drinking speed (mL/min), and number of rewarded or unrewarded visits) were collected. Calves with NCD were pair-matched to healthy controls (31 pairs) by farm, gender, and age at case diagnosis to assess for differences in feeding behavior between case and control calves. Calves were first diagnosed with NCD on day 0, and a NCD case was defined as calves with a fecal score of ≥2 for 2 consecutive days, whereas control calves remained healthy. Repeated measure mixed linear regression models were used to determine if there were differences between case and control calves in their daily AMF feeding behavior data in the days surrounding diagnosis of NCD (-3 to +5 days). Calves with NCD consumed less milk on day 0, day 1, day 3, day 4 and day 5 following diagnosis compared to control calves. Calves with NCD also had fewer rewarded visits to the AMF on day -1 and day 0 compared to control calves. However, while there was a NCD status × day interaction for unrewarded visits, there was only a tendency for differences between NCD and control calves on day 0. In this study, feeding behaviors were not clinically useful for diagnosing NCD due to insufficient diagnostic ability. However, feeding behaviors are a useful screening tool for producers to identify calves requiring further attention.

15.
BMC Biol ; 20(1): 15, 2022 01 13.
Article in English | MEDLINE | ID: mdl-35022024

ABSTRACT

BACKGROUND: Over 120 million mice and rats are used annually in research, conventionally housed in shoebox-sized cages that restrict natural behaviours (e.g. nesting and burrowing). This can reduce physical fitness, impair thermoregulation and reduce welfare (e.g. inducing abnormal stereotypic behaviours). In humans, chronic stress has biological costs, increasing disease risks and potentially shortening life. Using a pre-registered protocol ( https://atrium.lib.uoguelph.ca/xmlui/handle/10214/17955 ), this meta-analysis therefore tested the hypothesis that, compared to rodents in 'enriched' housing that better meets their needs, conventional housing increases stress-related morbidity and all-cause mortality. RESULTS: Comprehensive searches (via Ovid, CABI, Web of Science, ProQuest and Scopus on May 24, 2020) yielded 10,094 publications. Screening for inclusion criteria (published in English, using mice or rats and providing 'enrichments' in long-term housing) yielded 214 studies (within 165 articles, using 6495 animals: 59.1% mice; 68.2% male; 31.8% isolation-housed), and data on all-cause mortality plus five experimentally induced stress-sensitive diseases: anxiety, cancer, cardiovascular disease, depression and stroke. The SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) tool assessed individual studies' risks of bias. Random-effects meta-analyses supported the hypothesis: conventional housing significantly exacerbated disease severity with medium to large effect sizes: cancer (SMD = 0.71, 95% CI = 0.54-0.88); cardiovascular disease (SMD = 0.72, 95% CI = 0.35-1.09); stroke (SMD = 0.87, 95% CI = 0.59-1.15); signs of anxiety (SMD = 0.91, 95% CI = 0.56-1.25); signs of depression (SMD = 1.24, 95% CI = 0.98-1.49). It also increased mortality rates (hazard ratio = 1.48, 95% CI = 1.25-1.74; relative median survival = 0.91, 95% CI = 0.89-0.94).
Meta-regressions indicated that such housing effects were ubiquitous across species and sexes, but could not identify the most impactful improvements to conventional housing. Data variability (assessed via coefficient of variation) was also not increased by 'enriched' housing. CONCLUSIONS: Conventional housing appears sufficiently distressing to compromise rodent health, raising ethical concerns. Results also add to previous work to show that research rodents are typically CRAMPED (cold, rotund, abnormal, male-biased, poorly surviving, enclosed and distressed), raising questions about the validity and generalisability of the data they generate. This research was funded by NSERC, Canada.
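Random-effects pooling of standardized mean differences like the SMDs reported above is commonly done with the DerSimonian-Laird estimator. A generic sketch under that assumption (not the authors' code; the inputs in the usage test are illustrative, not study data):

```python
import math

def dl_random_effects(effects, variances):
    """Pool effect sizes (e.g. SMDs) with the DerSimonian-Laird
    random-effects estimator.  Returns (pooled effect, 95% CI lower,
    95% CI upper, tau^2 between-study variance)."""
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # DL tau^2, floored at 0
    w_re = [1.0 / (v + tau2) for v in variances]                  # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2
```

With homogeneous inputs Q falls below its degrees of freedom, tau^2 is floored at zero, and the estimator collapses to the fixed-effect result; heterogeneous inputs inflate tau^2 and widen the interval.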


Subject(s)
Cardiovascular Diseases , Neoplasms , Stroke , Animals , Female , Housing , Male , Mice , Morbidity , Rats , Rodentia
16.
Zoonoses Public Health ; 69(1): 23-32, 2022 02.
Article in English | MEDLINE | ID: mdl-34476904

ABSTRACT

Campylobacter is the second leading cause of foodborne illness in the United States. Although many food production animals carry Campylobacter as commensal bacteria, consumption of poultry is the main source of human infection. Previous research suggests that the biology of Campylobacter results in complete flock colonization within days. However, a recent systematic review found that the on-farm prevalence of Campylobacter varies widely, with some flocks reporting low prevalence. We hypothesized that the low prevalence of Campylobacter in some flocks may be driven by a delayed introduction of the pathogen. The objectives of this study were to (a) develop a deterministic compartmental model that represents the biology of Campylobacter, (b) identify the parameter values that best represent the natural history of the pathogen in poultry flocks and (c) examine the possibility that a delayed introduction of the pathogen is sufficient to replicate the observed low prevalence examples documented in the literature. A deterministic compartmental model was developed to examine the dynamics of Campylobacter in chicken flocks over a 56-day period prior to movement to the abattoir. The model outcome of interest was the final population prevalence of Campylobacter at day 56. The resulting model that incorporated a high transmission rate (β = 1.04) was able to reproduce the wide range of prevalence estimates observed in the literature when pathogen introduction time is varied. Overall, we established that the on-farm transmission rate of Campylobacter in chickens is likely high and can result in complete colonization of a flock when introduced early. However, delaying the time at which the pathogen enters the flock can reduce the prevalence observed at 56 days. These results highlight the importance of enforcing strict biosecurity measures to prevent or delay the introduction of the bacteria to a flock.
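The qualitative behavior described here can be reproduced with a minimal susceptible-colonized (SI) compartmental model. A sketch under stated assumptions: β = 1.04/day is from the abstract, while the flock size, the single-bird seed, frequency-dependent transmission, and the Euler time step are all illustrative choices, not the authors' parameterization:

```python
def prevalence_at_day_56(beta=1.04, flock_size=20000.0, intro_day=0.0, dt=0.01):
    """Deterministic SI model of within-flock Campylobacter spread,
    Euler-integrated from the day of introduction to slaughter at
    day 56.  Returns the proportion of the flock colonized at day 56."""
    if intro_day >= 56:
        return 0.0                              # pathogen never enters before slaughter
    s, i = flock_size - 1.0, 1.0                # one colonized bird seeds the flock
    n = flock_size
    steps = int((56.0 - intro_day) / dt)
    for _ in range(steps):
        new_colonized = beta * s * i / n * dt   # frequency-dependent transmission
        s -= new_colonized
        i += new_colonized
    return i / n
```

With early introduction the logistic growth saturates long before day 56 (near-complete colonization), while introduction in the final week leaves the epidemic on its exponential upswing and day-56 prevalence stays low, matching the delayed-introduction hypothesis.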


Subject(s)
Campylobacter Infections , Campylobacter , Poultry Diseases , Animals , Campylobacter Infections/epidemiology , Campylobacter Infections/microbiology , Campylobacter Infections/veterinary , Chickens/microbiology , North America , Poultry Diseases/microbiology , Prevalence
17.
J Dairy Sci ; 104(11): 11995-12008, 2021 Nov.
Article in English | MEDLINE | ID: mdl-34364646

ABSTRACT

Many dairy farmers in North America disbud or dehorn their cattle to improve human and animal safety. The Farmers Assuring Responsible Management (FARM v. 4.0) program requires that disbudding be performed before 8 wk of age with pain-control medication. The objective of this observational cross-sectional study was to quantify disbudding and dehorning practices of Wisconsin dairy producers to target future extension programming. Responses from 217 Wisconsin dairy producers and calf raisers were collected via digital surveys distributed at extension events and through industry contacts. Of the 217 respondents, 188 performed on-farm disbudding themselves. Most respondents (61%) used caustic paste as their primary method, which was most commonly applied on the day the calf was born (53%). Hot iron was used by 32% of respondents, and surgical methods (gouge, scoop, or wire saw) were used by 6% of respondents. Hot-iron disbudding was most commonly performed at 4 to 8 wk of age (41%) and 1 to 4 wk of age (33%), whereas surgical methods were most commonly performed at 8 wk or older (73%). Pain-control medication was used by 43% of respondents. Specifically, 35% used an anti-inflammatory, and 21% used a local nerve block. Veterinary involvement in creating the disbudding protocol was associated with increased odds of using pain control. Respondents with a target weaning age of ≥10 wk had greater odds of complying with FARM disbudding requirements and were also more likely to use polled genetics. Respondents aged 18 to 34 and respondents with >60 calves were more likely to have made changes to their disbudding or dehorning protocol in the last decade. Although use of pain control was higher than in previous US studies, full adoption of pain management requires further extension efforts. Veterinarians appeared influential on adoption of pain control, and their involvement may encourage adoption of pain management. 
Further research should investigate how the implementation of new FARM v. 4.0 standards will change the disbudding and dehorning practices of American dairy producers.
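The associations reported above (e.g. veterinary involvement and increased odds of pain control) came from logistic regression; for intuition, the crude version of such an estimate is a 2x2-table odds ratio with a Wald interval. A generic epidemiologic sketch with made-up counts, not the survey's data:

```python
import math

def odds_ratio_wald(a, b, c, d):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a = exposed with outcome, b = exposed without,
    c = unexposed with outcome, d = unexposed without."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_) - 1.96 * se_log_or)
    upper = math.exp(math.log(or_) + 1.96 * se_log_or)
    return or_, lower, upper

# Hypothetical counts: pain-control use with vs without veterinary involvement
print(odds_ratio_wald(10, 10, 5, 20))
```

A multivariable model like the one in the study adjusts this crude estimate for covariates such as herd size and respondent age, so the published odds ratios will generally differ from the 2x2 calculation.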


Subject(s)
Horns , Nerve Block , Animals , Cattle , Dairying , Horns/surgery , Nerve Block/veterinary , Pain Management/veterinary , Wisconsin
18.
Prev Vet Med ; 195: 105472, 2021 Oct.
Article in English | MEDLINE | ID: mdl-34438246

ABSTRACT

Systematic reviews are a valuable tool for evaluating the efficacy of interventions and for quantifying associations. To be properly assessed, reviews must be comprehensively reported. The primary objective of this study was to evaluate the completeness of reporting of systematic reviews and meta-analyses in animal health. The secondary objective was to further characterize methods for literature searches and risk of bias assessments and to document whether the risk of bias component represented an assessment of risk of bias, study quality, or levels of evidence based on the primary studies included. The dataset comprised 91 systematic reviews or meta-analyses of interventions or exposures with at least one health outcome measured at the animal or animal byproduct level, in any companion or food animal species and published between 2014 and 2018. Two reviewers independently collected information on whether each item in the PRISMA reporting guidelines was reported, with disagreements resolved by consensus. There was considerable variability in the completeness of reporting among reviews; some items, such as eligibility criteria for inclusion, were reported in most reviews (>65 %). Other items were not consistently reported; for instance, in 60 % (54) of the reviews there was no information provided on the sample size of individual studies, populations, interventions and comparators, outcomes, or follow up period. Although 89 % (81) of systematic reviews with meta-analysis included the effect size estimate and confidence intervals, it was not possible to determine which study designs were included for 30 % (14) of reviews. 
Results from individual PRISMA item questions were combined to determine whether all aspects of each recommended item were reported; 71 % of items were adequately reported in less than half the systematic reviews without a meta-analysis, 35 % of the items were adequately reported in less than half the systematic reviews with a meta-analysis, and 71 % of items were adequately reported in less than half of the meta-analyses without a systematic review component. An assessment of individual study level bias was included in 64 % of the reviews, although this component included an evaluation of risk of bias (35 reviews), study quality (25 reviews), or levels of evidence based on study design (12 reviews). Reporting guidelines or clinical guidelines were inappropriately used to assess risk of bias in 9 reviews. Overall, the results of this study reveal that reporting of systematic reviews in the animal health literature is suboptimal and improvements are needed to enhance utility of these reviews.


Subject(s)
Research Design , Systematic Reviews as Topic/standards , Veterinary Medicine , Animals , Bias
19.
J Dairy Sci ; 104(9): 10143-10157, 2021 Sep.
Article in English | MEDLINE | ID: mdl-34099288

ABSTRACT

The objective of this scoping review was to characterize all available literature on modifiable management practices used during the dry period that have been evaluated for their effects on udder health in dairy cattle during the dry period and the subsequent lactation. Five databases and two conference proceedings were searched for relevant literature. Articles published in or after 1990 were eligible for inclusion. Eligible interventions or exposures were restricted to modifiable management practices; however, antimicrobial and teat sealant products were enumerated but not further characterized, as systematic reviews have been published on this topic. Other modifiable management practices were reported in 229 articles. Nutrition (n = 79), which included ration formulation and delivery (n = 44) and vitamin and mineral additives (n = 35), was the most commonly reported practice, followed by vaccines (n = 40) and modification of dry period length (n = 27). Risk of clinical mastitis (CM) was the most commonly reported outcome (n = 151); however, reporting of outcome risk periods varied widely between articles. Cure of existing intramammary infections (IMI) over the dry period (n = 40) and prevention of new IMI over the dry period (n = 54) were most commonly reported with a risk period between calving and 30 d in milk. Future systematic reviews with meta-analyses could target management practices such as nutrition, vaccines, and dry period length to quantify their effects on improving udder health during the dry period and early lactation. However, the variation in reporting of time at risk for CM and other outcomes challenges the ability of future synthesis work to inform management decisions on the basis of efficacy to cure or prevent IMI and CM. Consensus on which core outcomes should be evaluated in mastitis research and the selection of consistent risk periods for specific outcomes in animal trials is imperative.


Subject(s)
Cattle Diseases , Mastitis, Bovine , Animals , Anti-Bacterial Agents/therapeutic use , Cattle , Cattle Diseases/drug therapy , Cell Count/veterinary , Female , Lactation , Mammary Glands, Animal , Mastitis, Bovine/drug therapy , Mastitis, Bovine/prevention & control , Milk
20.
J Dairy Sci ; 104(5): 5881-5897, 2021 May.
Article in English | MEDLINE | ID: mdl-33685706

ABSTRACT

The use of local anesthesia and a nonsteroidal anti-inflammatory drug (NSAID) can reduce indicators of pain and inflammation and encourage self-rewarding behavior in calves following disbudding. Although the use of sedation may be recommended as a best practice for disbudding, there is little research in this area. The objective of this study was to evaluate the effects of xylazine sedation in conjunction with a local anesthetic and an NSAID in calves undergoing cautery disbudding. One hundred twenty-two group-housed female and male Holstein calves fed milk with automated feeders, aged 13 to 44 d, were enrolled over 9 replicates and randomly allocated to 1 of 2 treatments: (1) sedated: lidocaine cornual nerve block, 0.5 mg/kg meloxicam (administered subcutaneously) and 0.2 mg/kg xylazine (administered intramuscularly), or (2) nonsedated: lidocaine cornual nerve block and meloxicam. Outcomes collected consisted of feeding behavior (collected using automated milk feeders), latency to drink milk following disbudding, play behavior (induced by adding bedding), lying behavior, mechanical nociceptive threshold (MNT, measured using a pressure force algometer), struggling behavior during disbudding, length of time to administer the nerve block, length of time to disbud, and serum haptoglobin concentrations. Data were analyzed using mixed models with a fixed effect for baseline values and a random effect for trial replicate. Linear regression was used to assess continuous outcomes, logistic regression for binary outcomes, and Poisson and negative binomial models for count data, with negative binomial models used when the overdispersion term was significant. There were no detected differences between the treatment groups in mean daily milk consumption in the 72 h following disbudding. Sedated calves had reduced average milk drinking speed from 0 to 24 h and 24 to 48 h following disbudding compared with nonsedated calves, but no difference was detected from 48 to 72 h.
Sedated calves had reduced MNT at 0, 60, and 240 min after disbudding, but no differences were detected between groups at 24 h after disbudding. Nonsedated calves had 4.5 times the odds (95% CI: 1.5-13.2) of struggling more than twice during the disbudding procedure compared with sedated calves, and it took less time to administer a nerve block to sedated calves compared with nonsedated calves. At +3 h, nonsedated calves were 79 times (95% CI: 22.4-279.2) more likely to play compared with sedated calves, and 24 h after disbudding, sedated calves were 2 times more likely to play compared with nonsedated calves (95% CI: 0.93-4.3). The results indicate that calves sedated with xylazine for cautery disbudding responded less to painful stimuli (disbudding and MNT) both during and following the procedure and had a higher rate of play behavior 24 h following sedation compared with the nonsedated calves, but xylazine may also have a prolonged carryover effect that affects suckling behavior for 48 h following sedation.
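The Poisson-versus-negative-binomial choice described in the methods hinges on detecting overdispersion. One common diagnostic is the Pearson dispersion statistic from a Poisson fit, sketched generically here (the parameter count and the toy data are assumptions for illustration, not the study's analysis):

```python
def pearson_dispersion(counts, fitted_means, n_params=1):
    """Pearson chi-square dispersion for a Poisson model:
    sum((y - mu)^2 / mu) / (n - p).  Values well above 1 indicate
    overdispersion, favoring a negative binomial model."""
    chi2 = sum((y - mu) ** 2 / mu for y, mu in zip(counts, fitted_means))
    return chi2 / (len(counts) - n_params)

# Toy examples with a common fitted mean of 2.5
print(pearson_dispersion([2, 3, 2, 3], [2.5] * 4))    # near-equidispersed, well below 1
print(pearson_dispersion([0, 0, 10, 0], [2.5] * 4))   # clumped counts, well above 1
```

In practice the statistic comes from the fitted values of the actual regression model; a value near 1 is consistent with the Poisson assumption that the variance equals the mean.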


Subject(s)
Horns , Xylazine , Anesthetics, Local , Animals , Cattle , Cautery/veterinary , Female , Horns/surgery , Iron , Male , Xylazine/pharmacology